Equivalence of Labeled Markov Chains

Authors

  • Laurent Doyen
  • Thomas A. Henzinger
  • Jean-François Raskin
Abstract

We consider the equivalence problem for labeled Markov chains (LMCs), where each state is labeled with an observation. Two LMCs are equivalent if every finite sequence of observations has the same probability of occurrence in the two LMCs. We show that equivalence can be decided in polynomial time, using a reduction to the equivalence problem for probabilistic automata, which is known to be solvable in polynomial time. We provide an alternative algorithm to solve the equivalence problem, based on a new definition of bisimulation for probabilistic automata. We also extend the technique to decide the equivalence of weighted probabilistic automata. Then, we consider the equivalence problem for labeled Markov decision processes (LMDPs), which asks, given two LMDPs, whether for every scheduler (i.e., way of resolving the nondeterministic decisions) of either process there exists a scheduler for the other process such that the resulting LMCs are equivalent. The decidability of this problem remains open. We show that the schedulers can be restricted to be observation-based, but may require infinite memory.
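
As a rough illustration of the polynomial-time check mentioned above, the following sketch decides LMC equivalence with the standard linear-algebra technique for probabilistic-automaton equivalence: it explores the vector space spanned by the reachable weight vectors of the two chains and tests whether their difference ever assigns a nonzero probability to some observation word. The names (LMC, lmc_equivalent) and the NumPy-based implementation are illustrative assumptions, not the paper's own algorithm or code.

import numpy as np

class LMC:
    """Labeled Markov chain: initial distribution pi, row-stochastic
    transition matrix P, and one observation label per state."""
    def __init__(self, pi, P, labels):
        self.pi = np.asarray(pi, dtype=float)
        self.P = np.asarray(P, dtype=float)
        self.labels = list(labels)

    def obs_matrix(self, sigma):
        # M_sigma = D_sigma @ P, where D_sigma keeps only states labeled sigma;
        # then Pr(sigma_1 ... sigma_n) = pi @ M_sigma_1 @ ... @ M_sigma_n @ 1.
        D = np.diag([1.0 if l == sigma else 0.0 for l in self.labels])
        return D @ self.P

def lmc_equivalent(A, B, tol=1e-9):
    """True iff every finite observation sequence has the same probability
    in A and B (up to numerical tolerance)."""
    sigmas = sorted(set(A.labels) | set(B.labels))
    nA, nB = len(A.pi), len(B.pi)
    # Difference vector and block-diagonal observation matrices of the two chains.
    u = np.concatenate([A.pi, -B.pi])
    M = {s: np.block([[A.obs_matrix(s), np.zeros((nA, nB))],
                      [np.zeros((nB, nA)), B.obs_matrix(s)]]) for s in sigmas}
    f = np.ones(nA + nB)
    # Explore span{u @ M_w : w in Sigma*}; its dimension is at most nA + nB,
    # so the worklist loop adds at most polynomially many basis vectors.
    basis = []            # orthonormal basis of the explored subspace
    worklist = [u]
    while worklist:
        v = worklist.pop()
        for b in basis:                     # Gram-Schmidt: remove known directions
            v = v - (v @ b) * b
        norm = np.linalg.norm(v)
        if norm > tol:                      # v spans a new direction
            v = v / norm
            basis.append(v)
            worklist.extend(v @ M[s] for s in sigmas)
    # Equivalent iff every reachable direction gives total weight 0, i.e. the
    # two chains assign the same probability to every observation word.
    return all(abs(b @ f) <= tol for b in basis)

if __name__ == "__main__":
    # Both chains emit the observation 'a' at every step with probability 1.
    A = LMC(pi=[1.0], P=[[1.0]], labels=["a"])
    B = LMC(pi=[0.5, 0.5], P=[[0.0, 1.0], [1.0, 0.0]], labels=["a", "a"])
    print(lmc_equivalent(A, B))             # expected: True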

Similar articles

Metrics for Labeled Markov Systems

Partial Labeled Markov Chains are simultaneously generalizations of process algebra and of traditional Markov chains. They provide a foundation for interacting discrete probabilistic systems, the interaction being synchronization on labels as in process algebra. Existing notions of process equivalence are too sensitive to the exact probabilities of various transitions. This paper addresses cont...

Approximating Labeled Markov Processes

We study approximate reasoning about continuous-state labeled Markov processes. We show how to approximate a labeled Markov process by a family of finite-state labeled Markov chains. We show that the collection of labeled Markov processes carries a Polish space structure with a countable basis given by finite state Markov chains with rational probabilities. The primary technical tools that we d...

Stochastic equivalence for performance analysis of concurrent systems in dtsiPBC

We propose an extension with immediate multiactions of discrete time stochastic Petri Box Calculus (dtsPBC), presented by I.V. Tarasyuk. The resulting algebra dtsiPBC is a discrete time analogue of stochastic Petri Box Calculus (sPBC) with immediate multiactions, designed by H. Macià, V. Valero et al. within a continuous time domain. The step operational semantics is constructed via labeled pro...

Empirical Bayes Estimation in Nonstationary Markov chains

Estimation procedures for nonstationary Markov chains appear to be relatively sparse. This work introduces empirical Bayes estimators for the transition probability matrix of a finite nonstationary Markov chain. The data are assumed to be of a panel study type in which each data set consists of a sequence of observations on N>=2 independent and identically dis...

A Definition Scheme for Quantitative Bisimulation

FuTS, state-to-function transition systems, are generalizations of labeled transition systems and of familiar quantitative semantical models such as continuous-time Markov chains, interactive Markov chains, and Markov automata. A general scheme for the definition of a notion of strong bisimulation associated with a FuTS is proposed. It is shown that this notion of bisimulation for a FuTS c...

Journal:
  • Int. J. Found. Comput. Sci.

Volume 19  Issue

Pages  -

Publication date: 2008